Data centers consume massive amounts of water – companies rarely tell the public exactly how much
As demand for artificial intelligence drives construction, and proposed construction, of data centers around the world, those facilities require not just electricity and land but also significant amounts of water. Data centers use water directly, with cooling water pumped through pipes in and around the computer equipment. They also use water indirectly, through the water required to generate the electricity that powers the facility. The amount of water used to produce electricity rises dramatically when the source is fossil fuels rather than solar or wind. A 2024 report from the Lawrence Berkeley National Laboratory estimated that in 2023, U.S. data centers consumed 17 billion gallons (64 billion liters) of water directly through cooling, and projected that by 2028 that figure could double, or even quadruple.
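The figures above can be checked with simple arithmetic. A minimal sketch (the 2x-4x range is the report's projection; the gallon-to-liter factor is a standard conversion):

```python
# Back-of-the-envelope check of the U.S. data center water figures
# cited from the Lawrence Berkeley National Laboratory report.

GALLONS_TO_LITERS = 3.78541  # liters per U.S. gallon (exact conversion)

direct_use_2023_gal = 17e9  # 17 billion gallons used directly for cooling in 2023

# Sanity-check the liter figure quoted in the article (~64 billion liters).
direct_use_2023_l = direct_use_2023_gal * GALLONS_TO_LITERS

# Projected 2028 range: double to quadruple the 2023 figure.
low_2028_gal = 2 * direct_use_2023_gal   # 34 billion gallons
high_2028_gal = 4 * direct_use_2023_gal  # 68 billion gallons

print(f"2023 direct use: {direct_use_2023_l / 1e9:.0f} billion liters")
print(f"2028 projection: {low_2028_gal / 1e9:.0f}-{high_2028_gal / 1e9:.0f} billion gallons")
```

Note that these numbers cover only direct cooling use; the indirect water footprint of electricity generation comes on top of them.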
2 OLMo 2 Furious
Team OLMo, Pete Walsh, Luca Soldaini, Dirk Groeneveld, Kyle Lo, Shane Arora, Akshita Bhagia, Yuling Gu, Shengyi Huang, Matt Jordan, Nathan Lambert, Dustin Schwenk, Oyvind Tafjord, Taira Anderson, David Atkinson, Faeze Brahman, Christopher Clark, Pradeep Dasigi, Nouha Dziri, Michal Guerquin, Hamish Ivison, Pang Wei Koh, Jiacheng Liu, Saumya Malik, William Merrill, Lester James V. Miranda, Jacob Morrison, Tyler Murray, Crystal Nam, Valentina Pyatkin, Aman Rangapur, Michael Schmitz, Sam Skjonsberg, David Wadden, Christopher Wilhelm, Michael Wilson, Luke Zettlemoyer, Ali Farhadi, Noah A. Smith, Hannaneh Hajishirzi
We present OLMo 2, the next generation of our fully open language models. OLMo 2 includes dense autoregressive models with improved architecture and training recipe, pretraining data mixtures, and instruction tuning recipes. Our modified model architecture and training recipe achieve both better training stability and improved per-token efficiency. Our updated pretraining data mixture introduces a new, specialized data mix called Dolmino Mix 1124, which significantly improves model capabilities across many downstream task benchmarks when introduced via late-stage curriculum training (i.e. specialized data during the annealing phase of pretraining). Finally, we incorporate best practices from Tülu 3 to develop OLMo 2-Instruct, focusing on permissive data and extending our final-stage reinforcement learning with verifiable rewards (RLVR). Our OLMo 2 base models sit at the Pareto frontier of performance versus compute, often matching or outperforming open-weight-only models like Llama 3.1 and Qwen 2.5 while using fewer FLOPs and with fully transparent training data, code, and recipes. Our fully open OLMo 2-Instruct models are competitive with or surpass open-weight-only models of comparable size, including Qwen 2.5, Llama 3.1, and Gemma 2. We release all OLMo 2 artifacts openly: models at 7B and 13B scales, both pretrained and post-trained, including their full training data, training code and recipes, training logs, and thousands of intermediate checkpoints. The final instruction model is available on the Ai2 Playground as a free research demo.